Application of Recursive Least Squares to Efficient Blunder Detection in Linear Models

Authors

Abstract:

In many geodetic applications, a large number of observations is measured to estimate the unknown parameters. The unbiasedness of the estimated parameters is ensured only if the observations are free of bias (e.g. systematic effects) and of falsified observations, also known as outliers. One of the most important steps towards a coherent parameter-estimation analysis is therefore the detection and elimination of outliers, which appear inconsistent with the remainder of the observations or with the model. Outlier detection is thus a primary step in many geodetic applications. Among the various methods for handling outlying observations, a sequential data snooping procedure known as the Detection, Identification and Adaptation (DIA) algorithm is employed in the present contribution. An efficient data snooping procedure is based on Baarda's theory, in which blunders are detected element-wise and the model is adapted in an iterative manner. This method can become computationally expensive when a large number of blunders is present in the observations. An attempt is made here to optimize this commonly used outlier detection method; the optimization improves the computational time and complexity of the conventional approach. An equivalent formulation is presented that simplifies the elimination of outliers from an estimation set-up in a linear model. The method becomes more efficient when a large number of model parameters is involved in the inversion, since in the conventional method this leads to a large normal matrix that must be inverted repeatedly. Based on the recursive least squares method, the normal matrix inversion is avoided in the presented algorithm. The accuracy and performance of the proposed formulation are validated on two real data sets. The formulation has no numerical impact on the final result, which is identical to that of the conventional outlier elimination. The method is also tested on a simulated case to investigate the accuracy of the outlier detection in critical situations where a large portion of the data is contaminated. In the application considered, the proposed algorithm is shown to be faster than the conventional method by at least a factor of 3, and the speed-up grows as the number of observations and parameters increases.
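
The paper's own derivation is not reproduced in this abstract, but the loop it optimizes can be illustrated with a minimal sketch of sequential data snooping: each pass computes Baarda's w-test statistics, removes the observation with the largest failing statistic, and downdates the inverse of the normal matrix with a rank-one (Sherman-Morrison) update instead of re-inverting it. The function name snoop_rls, the unit-weight assumption and the critical value 3.29 below are illustrative choices, not taken from the paper.

```python
import numpy as np

def snoop_rls(A, y, crit=3.29, sigma0=1.0):
    """Sketch of sequential data snooping (Baarda's w-test) for y = A x + e,
    assuming uncorrelated, equally weighted observations. The normal matrix
    N = A^T A is inverted once; removing an observation only requires a
    rank-one Sherman-Morrison downdate of its inverse."""
    A = np.asarray(A, dtype=float)
    y = np.asarray(y, dtype=float)
    active = np.ones(len(y), dtype=bool)    # observations still in the model
    Ninv = np.linalg.inv(A.T @ A)           # inverted only once
    b = A.T @ y                             # right-hand side A^T y

    while True:
        x_hat = Ninv @ b
        idx = np.where(active)[0]
        v = y[idx] - A[idx] @ x_hat                                  # residuals
        # redundancy numbers r_i = 1 - a_i^T Ninv a_i (diagonal of Q_v / sigma0^2)
        r = 1.0 - np.einsum('ij,jk,ik->i', A[idx], Ninv, A[idx])
        w = np.abs(v) / (sigma0 * np.sqrt(np.clip(r, 1e-12, None)))  # w-test
        k = int(np.argmax(w))
        if w[k] <= crit:                     # no blunder detected: stop
            return x_hat, idx
        # adaptation: drop observation idx[k] without re-inverting N
        a_i = A[idx[k]]
        u = Ninv @ a_i
        Ninv = Ninv + np.outer(u, u) / (1.0 - a_i @ u)   # Sherman-Morrison downdate
        b = b - a_i * y[idx[k]]
        active[idx[k]] = False
```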


Similar articles

Recursive least squares with linear constraints

Recursive Least Squares (RLS) algorithms have widespread applications in many areas, such as real-time signal processing, control and communications. This paper shows that the unique solutions to the linear-equality-constrained and the unconstrained LS problems, respectively, always have exactly the same recursive form. Their only difference lies in the initial values. Based on this, a recursive a...
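
For reference, the unconstrained recursion that abstract refers to is the standard per-observation RLS update; per the cited result, the equality-constrained solution follows the same recursion and differs only in its initial values. The helper name rls_step and the forgetting factor lam below are illustrative assumptions.

```python
import numpy as np

def rls_step(x, P, a, y, lam=1.0):
    """One recursive least-squares update for a scalar observation y = a^T x + e.
    x: current estimate (n,); P: current inverse-information matrix (n, n);
    a: regression vector (n,); lam: forgetting factor (1.0 = ordinary LS)."""
    Pa = P @ a
    k = Pa / (lam + a @ Pa)            # gain vector
    x = x + k * (y - a @ x)            # correct the estimate with the innovation
    P = (P - np.outer(k, Pa)) / lam    # update the inverse-information matrix
    return x, P

# Typical unconstrained initialization: x0 = 0 and P0 = (1/delta) * I with small delta;
# a constrained variant would only change this initialization, not the step itself.
```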


Kernel Recursive Least Squares

We present a non-linear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared-error regressor. Sparsity (and therefore regularization) of the solution is achieved by an explicit greedy sparsification proces...
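
As a rough illustration of the recursive kernel idea (without the greedy sparsification that paper describes), the inverse of the regularized Gram matrix can be grown one sample at a time with a block-inverse update, so no full re-inversion is needed. The class name KernelRLS, the RBF kernel and the regularization lam below are illustrative assumptions, not the cited algorithm.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two sample vectors."""
    return np.exp(-gamma * np.sum((np.asarray(a, float) - np.asarray(b, float)) ** 2))

class KernelRLS:
    """Online kernel regression: the inverse of the regularized Gram matrix
    (K + lam * I) is extended sample by sample via a block-inverse update."""
    def __init__(self, kernel=rbf, lam=1e-3):
        self.kernel, self.lam = kernel, lam
        self.X, self.y, self.Kinv = [], [], None

    def update(self, x, y):
        if self.Kinv is None:
            self.Kinv = np.array([[1.0 / (self.kernel(x, x) + self.lam)]])
        else:
            k = np.array([self.kernel(xi, x) for xi in self.X])
            b = self.Kinv @ k
            g = self.kernel(x, x) + self.lam - k @ b      # Schur complement
            n = len(self.X)
            Kinv = np.empty((n + 1, n + 1))
            Kinv[:n, :n] = self.Kinv + np.outer(b, b) / g
            Kinv[:n, n] = Kinv[n, :n] = -b / g
            Kinv[n, n] = 1.0 / g
            self.Kinv = Kinv
        self.X.append(x)
        self.y.append(y)
        self.alpha = self.Kinv @ np.array(self.y)          # dual weights

    def predict(self, x):
        return sum(a * self.kernel(xi, x) for a, xi in zip(self.alpha, self.X))
```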


Recursive Least Squares Estimation

We start with estimation of a constant based on several noisy measurements. Suppose we have a resistor but do not know its resistance. So we measure it several times using a cheap (and noisy) multimeter. How do we come up with a good estimate of the resistance based on these noisy measurements? More formally, suppose x = (x1, x2, ..., xn)^T is a constant but unknown vector, and y = (y1, y2, ...
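
A minimal version of that resistance example, assuming equally accurate readings, is a recursive sample mean: each new measurement corrects the running estimate with a gain of 1/k, so the batch least-squares answer is reproduced without storing the data. The 100-ohm value and noise level below are made up for illustration.

```python
import numpy as np

def recursive_constant_estimate(measurements):
    """Recursively estimate a constant from noisy scalar readings."""
    x_hat = 0.0
    for k, y in enumerate(measurements, start=1):
        x_hat += (y - x_hat) / k      # gain 1/k; equals the running mean
    return x_hat

# Hypothetical data: a 100-ohm resistor read 50 times with a noisy multimeter.
readings = 100.0 + np.random.default_rng(0).normal(0.0, 2.0, size=50)
print(recursive_constant_estimate(readings))   # ~100, identical to readings.mean()
```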


Efficient Reinforcement Learning Using Recursive Least-Squares Methods

The recursive least-squares (RLS) algorithm is one of the most well-known algorithms used in adaptive filtering, system identification and adaptive control. Its popularity is mainly due to its fast convergence speed, which is considered to be optimal in practice. In this paper, RLS methods are used to solve reinforcement learning problems, where two new reinforcement learning algorithms using l...


Linear mixed models and penalized least squares

Linear mixed-effects models are an important class of statistical models that are not only used directly in many fields of applications but also used as iterative steps in fitting other types of mixed-effects models, such as generalized linear mixed models. The parameters in these models are typically estimated by maximum likelihood (ML) or restricted maximum likelihood (REML). In general there...


Hierarchic Kernel Recursive Least-Squares

We present a new hierarchic kernel-based modeling technique for modeling evenly distributed multidimensional datasets that does not rely on input space sparsification. The presented method reorganizes the typical single-layer kernel-based model into a hierarchical structure, such that the weights of a kernel model over each dimension are modeled over the adjacent dimension. We show that the impos...




Journal

Volume 5, Issue 2

Pages 258-267

Publication date: 2015-11


Keywords
